Preconditioned Krylov Subspace Methods for Sampling Multivariate Gaussian Distributions

Authors

  • Edmond Chow
  • Yousef Saad
Abstract

A common problem in statistics is to compute sample vectors from a multivariate Gaussian distribution with zero mean and a given covariance matrix A. A canonical approach to the problem is to compute vectors of the form y = Sz, where S is the Cholesky factor or square root of A, and z is a standard normal vector. When A is large, such an approach becomes computationally expensive. This paper considers preconditioned Krylov subspace methods to perform this task. The Lanczos process provides a means to approximate A^{1/2}z for any vector z from an m-dimensional Krylov subspace. The main contribution of this paper is to show how to enhance the convergence of the process via preconditioning. Both incomplete Cholesky preconditioners and approximate inverse preconditioners are discussed. It is argued that the latter class of preconditioners has an advantage in the context of sampling. Numerical tests are performed with stationary covariance matrices used to model Gaussian processes and illustrate the dramatic improvement in computation time that preconditioning can confer.
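To make the two approaches in the abstract concrete, here is a minimal sketch in Python/NumPy (all names and the test matrix are illustrative assumptions, not taken from the paper): the canonical sampler y = Sz with S a Cholesky factor, and the unpreconditioned Lanczos approximation y_m = ||z|| Q_m T_m^{1/2} e_1 of A^{1/2}z built from an m-dimensional Krylov subspace. The preconditioned variants that constitute the paper's contribution are not reproduced here.

```python
import numpy as np

def lanczos_sqrt_sample(A, z, m):
    """Approximate y = A^{1/2} z via m steps of the Lanczos process:
    y_m = ||z|| * Q_m * T_m^{1/2} * e_1.  If z is standard normal, y_m is an
    approximate sample from N(0, A).  Unpreconditioned baseline only."""
    n = len(z)
    beta0 = np.linalg.norm(z)
    Q = np.zeros((n, m))
    alpha = np.zeros(m)          # diagonal of the tridiagonal matrix T_m
    beta = np.zeros(m)           # off-diagonal entries of T_m
    Q[:, 0] = z / beta0
    q_prev, b_prev = np.zeros(n), 0.0
    for j in range(m):
        w = A @ Q[:, j] - b_prev * q_prev
        alpha[j] = w @ Q[:, j]
        w = w - alpha[j] * Q[:, j]
        w = w - Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)   # full reorthogonalization
        b = np.linalg.norm(w)
        beta[j] = b
        if b == 0.0:                                  # hit an invariant subspace
            m = j + 1
            Q, alpha, beta = Q[:, :m], alpha[:m], beta[:m]
            break
        if j + 1 < m:
            Q[:, j + 1] = w / b
        q_prev, b_prev = Q[:, j], b
    T = np.diag(alpha) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
    evals, V = np.linalg.eigh(T)                      # small m x m eigenproblem
    sqrtT_e1 = V @ (np.sqrt(np.maximum(evals, 0.0)) * V[0, :])   # T_m^{1/2} e_1
    return beta0 * (Q @ sqrtT_e1)

# Illustrative test with a generic SPD matrix (not a covariance model from the paper).
rng = np.random.default_rng(0)
n = 400
X = rng.standard_normal((n, n))
A = X @ X.T / n + np.eye(n)                 # symmetric positive definite
z = rng.standard_normal(n)

y_chol = np.linalg.cholesky(A) @ z          # canonical approach: y = S z
evals, V = np.linalg.eigh(A)
y_exact = V @ (np.sqrt(evals) * (V.T @ z))  # exact A^{1/2} z, for reference
y_lanczos = lanczos_sqrt_sample(A, z, m=40)
# y_chol and y_exact differ as vectors but both have covariance A;
# y_lanczos approximates y_exact.
print(np.linalg.norm(y_lanczos - y_exact) / np.linalg.norm(y_exact))
```

For a well-conditioned A, a modest m already gives a small error; the point of the paper's preconditioning is to reach comparable accuracy with far fewer matrix-vector products when A is ill conditioned.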


Similar articles


Slice Sampling with Adaptive Multivariate Steps: The Shrinking-Rank Method

The shrinking-rank method is a variation of slice sampling that is efficient at sampling from multivariate distributions with highly correlated parameters. It requires that the gradient of the log density be computable. At each individual step, it approximates the current slice with a Gaussian occupying a shrinking-dimension subspace. The dimension of the approximation is shrunk orthogo...
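For context only, the sketch below shows a simpler multivariate slice sampler, Neal's hyperrectangle-shrinkage scheme, not the shrinking-rank method described above (which instead shrinks the dimension of a Gaussian approximation to the slice and requires the gradient of the log density). All names are illustrative.

```python
import numpy as np

def slice_sample_hyperrect(logf, x0, w, n_samples, rng=None):
    """Multivariate slice sampling with hyperrectangle shrinkage (Neal, 2003).
    logf: unnormalized log density; w: initial box width(s)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    d = x.size
    samples = np.empty((n_samples, d))
    for s in range(n_samples):
        level = logf(x) + np.log(rng.uniform())        # log height defining the slice
        L = x - w * rng.uniform(size=d)                # randomly placed initial box
        R = L + w
        while True:
            x_prop = L + (R - L) * rng.uniform(size=d) # uniform draw from the box
            if logf(x_prop) >= level:                  # proposal lies in the slice
                x = x_prop
                break
            lower = x_prop < x                         # shrink the box toward x
            L[lower] = x_prop[lower]
            R[~lower] = x_prop[~lower]
        samples[s] = x
    return samples

# Illustrative use on a highly correlated 2-D Gaussian.
Sigma = np.array([[1.0, 0.95], [0.95, 1.0]])
P = np.linalg.inv(Sigma)
draws = slice_sample_hyperrect(lambda x: -0.5 * x @ P @ x, np.zeros(2), 2.0, 1000)
```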


Multivariate Gaussian Simulation Outside Arbitrary Ellipsoids

Methods for simulation from multivariate Gaussian distributions restricted to lie outside an arbitrary ellipsoidal region are often needed in applications. A standard rejection algorithm that draws a sample from a multivariate Gaussian distribution and accepts it if it is outside the ellipsoid is often employed; however, this is computationally inefficient if the probability of that ellipso...
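The sketch below implements the standard rejection baseline described in the abstract, assuming the ellipsoid is given as (x - c)' A (x - c) <= r2; the names are illustrative and this is not the simulation method the paper proposes. It also returns the empirical acceptance rate, which collapses when the ellipsoid carries most of the probability mass.

```python
import numpy as np

def reject_outside_ellipsoid(mu, Sigma, A, c, r2, n_samples, max_tries=10**6, rng=None):
    """Draw x ~ N(mu, Sigma) and keep it only if (x - c)' A (x - c) > r2,
    i.e. if x falls outside the ellipsoid."""
    rng = np.random.default_rng() if rng is None else rng
    L = np.linalg.cholesky(Sigma)            # Sigma = L L'
    d = len(mu)
    accepted, tries = [], 0
    while len(accepted) < n_samples and tries < max_tries:
        x = mu + L @ rng.standard_normal(d)
        tries += 1
        y = x - c
        if y @ A @ y > r2:                   # outside the ellipsoid: accept
            accepted.append(x)
    return np.array(accepted), len(accepted) / tries
```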


A new rejection sampling method for truncated multivariate Gaussian random variables restricted to convex sets

Statistical researchers have shown increasing interest in generating truncated multivariate normal distributions. In this paper, we only assume that the acceptance region is convex and we focus on rejection sampling. We propose a new algorithm that outperforms the crude rejection method for the simulation of truncated multivariate Gaussian random variables. The proposed algorithm is based on a gene...
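For comparison with the entry above, here is the crude rejection baseline that the proposed algorithm is measured against, with the convex acceptance region supplied only through an indicator function; all names are illustrative and the paper's own algorithm is not reproduced.

```python
import numpy as np

def crude_rejection_tmvn(mu, Sigma, in_convex_set, n_samples, max_draws=10**6, rng=None):
    """Crude rejection for N(mu, Sigma) truncated to a convex set given only
    through the indicator in_convex_set(x)."""
    rng = np.random.default_rng() if rng is None else rng
    L = np.linalg.cholesky(Sigma)
    kept = []
    for _ in range(max_draws):
        x = mu + L @ rng.standard_normal(len(mu))
        if in_convex_set(x):                 # inside the convex region: keep
            kept.append(x)
            if len(kept) == n_samples:
                break
    return np.array(kept)

# Illustrative convex region: the polytope {x : F x + g >= 0}, here x1 >= 2 and x2 >= 2,
# where the untruncated density puts little mass, so acceptance is rare.
F, g = np.eye(2), np.array([-2.0, -2.0])
samples = crude_rejection_tmvn(np.zeros(2), np.eye(2),
                               lambda x: np.all(F @ x + g >= 0), 100)
```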


Exact Hamiltonian Monte Carlo for Truncated Multivariate Gaussians

We present a Hamiltonian Monte Carlo algorithm to sample from multivariate Gaussian distributions in which the target space is constrained by linear and quadratic inequalities or products thereof. The Hamiltonian equations of motion can be integrated exactly and there are no parameters to tune. The algorithm mixes faster and is more efficient than Gibbs sampling. The runtime depends on the numb...
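The sketch below follows the reflective exact-HMC idea summarized above for the simplest case: a standard normal target restricted by linear constraints F x + g >= 0 (a general N(mu, Sigma) can be whitened to this form). The closed-form harmonic trajectory is followed until it meets a constraint wall, where the velocity is reflected. Quadratic constraints, which the paper also handles, are omitted, and all names are illustrative.

```python
import numpy as np

def exact_hmc_truncated_gaussian(x0, F, g, n_samples, T=np.pi / 2, rng=None):
    """Sketch of exact HMC for N(0, I) restricted to {x : F x + g >= 0}.
    The dynamics x(t) = x cos t + v sin t are integrated in closed form; when a
    constraint wall is reached, the velocity is reflected off its normal."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()   # must satisfy F x + g > 0 strictly
    d = x.size
    samples = np.empty((n_samples, d))
    eps = 1e-10                              # avoid re-detecting the wall just left

    for s in range(n_samples):
        v = rng.standard_normal(d)           # fresh Gaussian momentum
        t_left = T
        while True:
            # earliest time at which any wall j satisfies f_j.x(t) + g_j = 0
            a, b = F @ x, F @ v
            u = np.hypot(a, b)
            phi = np.arctan2(b, a)
            t_hit, j_hit = np.inf, -1
            for j in range(F.shape[0]):
                if u[j] < abs(g[j]):
                    continue                 # this wall is never reached
                acos = np.arccos(-g[j] / u[j])
                for t in ((phi[j] + acos) % (2 * np.pi), (phi[j] - acos) % (2 * np.pi)):
                    if eps < t < t_hit:
                        t_hit, j_hit = t, j
            if t_hit >= t_left:              # no wall before the end of the trajectory
                x = x * np.cos(t_left) + v * np.sin(t_left)
                break
            # move to the wall, then reflect the velocity off its normal f_j
            x, v = (x * np.cos(t_hit) + v * np.sin(t_hit),
                    -x * np.sin(t_hit) + v * np.cos(t_hit))
            f = F[j_hit]
            v = v - 2.0 * (f @ v) / (f @ f) * f
            t_left -= t_hit
        samples[s] = x
    return samples

# Illustrative example: standard bivariate normal with x1 >= 1 and x2 >= -0.5.
F = np.eye(2)
g = np.array([-1.0, 0.5])
draws = exact_hmc_truncated_gaussian(np.array([2.0, 1.0]), F, g, n_samples=1000)
```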




Publication date: 2013